Labwork 2: Parameter estimation in signal processing¶
Gabriel Gostiaux¶
Contents¶
- Estimation under additive normal noise
- Exponential case
- Estimator comparison
We consider the relaxation signal of a physical process:
$$ \begin{equation*} s_{clean}=s_{0} e^{-\alpha_{0} i}, i \in[0, N-1] \end{equation*} $$
The objective here is to estimate $\alpha_{0}$ from a noisy signal. We consider successively two types of noises: normal (Gaussian additive) and exponential (multiplicative) noises.
1 Estimation under additive normal noise¶
We first assume that the experiment is perturbed by an additive white normal noise.
- P1 Give the expression of the log-likelihood function $\ell(\alpha)$, then find the expression of the maximum likelihood estimator of $\alpha_{0}$. Can we give an analytical form?
To compute the log-likelihood, we need an expression for the probability law of the noisy signal. With additive Gaussian noise, each sample is Gaussian, centered on the clean signal and with the variance of the noise.
$$ \begin{align*} P(s_i) &= \frac{1}{\sqrt{2\pi}\sigma} \exp\left(-\frac{\left(s_i-s_0 e^{-\alpha_0 i}\right)^2}{2\sigma^2}\right) \\ l_\alpha &= -\frac{N}{2}\log(2\pi\sigma^2)-\frac{1}{2\sigma^2} \sum_{i=0}^{N-1}\left[s_i-s_0 e^{-\alpha i}\right]^2 \end{align*} $$
The MLE is then obtained by solving $ \partial l_\alpha / \partial \alpha = 0 $; the CRLB follows from the expected curvature of the log-likelihood, $ -\left\langle \partial^2 l_\alpha / \partial \alpha^2 \right\rangle $ (the Fisher information):
$$ \begin{equation*} \frac{\partial l_\alpha}{\partial \alpha} = -\frac{s_0}{\sigma^2}\sum_{i=0}^{N-1} i\, e^{-\alpha i}\left(s_i - s_0 e^{-\alpha i}\right) = 0 \end{equation*} $$
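No closed form for $\hat{\alpha}$ is apparent, so in practice the score equation is solved numerically. A minimal sketch (an illustration only, using the parameter values chosen later in this notebook: $N=100$, $s_0=10$, $\sigma=1$, $\alpha_0=0.1$) with `scipy.optimize.brentq`:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
N, s0, sigma, alpha0 = 100, 10.0, 1.0, 0.1
i = np.arange(N)
data = s0 * np.exp(-alpha0 * i) + rng.normal(0.0, sigma, N)  # one noisy realization

def score(alpha):
    """dl/dalpha up to the constant factor -s0/sigma^2 (the sign does not move the root)."""
    e = np.exp(-alpha * i)
    return np.sum(i * e * (data - s0 * e))

alpha_hat = brentq(score, 0.01, 1.0)  # root of the score = MLE, close to alpha0
```

The bracket $[0.01, 1]$ is assumed to enclose a sign change of the score, which holds comfortably at this signal-to-noise ratio.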
- P2 Give the expression of the Cramer-Rao Lower Bound (CRLB) for the estimation of $\alpha_{0}$. What does the ratio $s_{0} / \sigma$ represent physically?
$$ \begin{equation*} \frac{\partial^2 l_\alpha}{\partial \alpha^2} = \frac{s_0}{\sigma^2}\sum_{i=0}^{N-1} i^2 e^{-\alpha i}\left(s_i - 2 s_0 e^{-\alpha i}\right) \end{equation*} $$
In some cases (which ones?) the expression of the CRLB can be simplified using:
$$ \sum_{n=0}^{+\infty} n^{2} q^{n}=\frac{q(1+q)}{(1-q)^{3}}, q<1 $$
Taking the expectation of the curvature (so that $\langle s_i \rangle = s_0 e^{-\alpha_0 i}$) and letting the sum run to infinity, which is valid when $e^{-2\alpha_0 N}$ is negligible, we can use the formula:
$$ \begin{equation*} \left\langle \frac{\partial^2 l_\alpha}{\partial \alpha^2} \right\rangle = -\frac{s_0^2}{\sigma^2}\sum_{i=0}^{+\infty} i^2 e^{-2\alpha_0 i} \end{equation*} $$
The CRLB is then equal to :
$$ \begin{equation*} \text{CRLB} = -\frac{1}{\left\langle \frac{\partial^2 l_\alpha}{\partial \alpha^2} \right\rangle} = \frac{\sigma^2}{s_0^2} \times \frac{(1-e^{-2\alpha_0})^3}{e^{-2\alpha_0}(1+e^{-2\alpha_0})} \end{equation*} $$
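As a quick numerical sanity check (a sketch, using the values $\alpha_0 = 0.1$ and $s_0/\sigma = 10$ chosen below), the closed form can be compared against a direct truncation of the Fisher-information sum:

```python
import numpy as np

s0, sigma, alpha0 = 10.0, 1.0, 0.1
q = np.exp(-2 * alpha0)

# Closed form, via sum_{n>=0} n^2 q^n = q(1+q)/(1-q)^3
crlb_closed = (sigma**2 / s0**2) * (1 - q)**3 / (q * (1 + q))

# Direct truncation of the Fisher-information sum
n = np.arange(5000.0)
crlb_sum = (sigma**2 / s0**2) / np.sum(n**2 * q**n)

print(crlb_closed, crlb_sum)  # both close to 4.0e-05
```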
$\rightsquigarrow$ Design a matlab function which gives a realization (length $N=100$ ) of the noisy signal. The arguments will be $s_{0}, \alpha_{0}$ and $\sigma$, the standard deviation of noise. You can choose the parameters as $\alpha_{0}=0.1, s_{0} / \sigma=10$.
$\rightsquigarrow$ Design a function which calculate the log-likelihood function $\ell(\alpha)$ using the previous realization. $\alpha$ should be an input parameter of the function.
# Import necessary modules & libraries
import numpy as np
from math import pi
import plotly.graph_objects as go
from scipy import optimize
import plotly.io as pio
pio.renderers.default = 'notebook' # For classic Jupyter Notebook
# Define global parameters
N = 100
time = list(range(N))                      # sample indices i = 0 .. N-1
decay_rate = [i / 100 for i in range(N)]   # grid of candidate alphas in [0, 0.99]
decay = decay_rate[10]                     # true decay rate alpha_0 = 0.1
amplitude = 10                             # s_0
noise_std = amplitude / 10                 # sigma, so that s_0 / sigma = 10
# Define the necessary functions
def s(x, alpha):
"""Computes the determinist function. (clean signal)
Args:
alpha (scalar): defines the decay rate of the function.
x (list): time serie.
Returns:
list: computed clean signal.
"""
x = np.asarray(x)
clean_signal = amplitude * np.exp(- alpha * x)
return clean_signal
def sNoisy(clean, noise_std):
"""Computes the noisy signal, composed of a determinist "clean signal" and an additive white gaussian noise here.
Args:
clean (ndarray): clean signal to add a noise too.
noise_std (scalar): standard deviation of the gaussian white noise.
Returns:
ndarray: noisy signal.
"""
# Ajouter du bruit gaussien
noise = np.random.normal(0, noise_std, size=clean.shape)
res = np.add(clean, noise)
return res
def logLLH(alpha, noisy, x):
"""Computes the log-likelyhood of a noisy signal for a specific alpha.
Args:
alpha (scalar): decay rate of the determinist "clean signal""Données pour les TP-20240920"
noisy (ndarray): noisy signal.
x (list): time serie.
Returns:
scalar: log-likelyhood for a specific decay rate and for a selected noisy signal.
"""
sigma = np.std(noisy) #scalar
clean = s(x, alpha)
LLH = -N/2 * np.log(2 * pi * sigma**2) - 1/2/sigma**2 * np.sum( np.add( noisy, -clean)**2)
return LLH
# Define ploting functions
def plotSignals(decay):
"""Generates one realization of the signal with additive white Gaussian noise and plots it together with the clean signal.
Args:
decay (scalar): decay rate of the deterministic exponential signal.
Returns:
tuple: clean signal, noisy signal.
"""
# Time-domain signals, alpha = 0.1
clean = s(time, decay)
noisy = sNoisy(clean, noise_std)
## Plotting the signals
fig = go.Figure()
# Add the noisy-signal trace
fig.add_trace(go.Scatter(
x=time,
y=noisy,
mode='markers+lines',
name='Noisy signal',
marker=dict(size=4)
))
# Add the clean-signal trace
fig.add_trace(go.Scatter(
x=time,
y=clean,
mode='markers+lines',
name='Clean signal',
marker=dict(size=2)
))
# Customize layout
fig.update_layout(
title='Noisy and clean signal, for alpha = 0.1',
xaxis_title='Time',
yaxis_title='Signals',
showlegend=True
)
# Show the plot
fig.show()
return clean, noisy
def plotLLH(decay):
"""Plots the log-likelihood of one noisy realization as a function of the decay rate alpha.
Args:
decay (scalar): true decay rate used to generate the realization.
"""
# Log-likelihoods for varying alpha
LLHs = []
clean = s(time, decay)
noisy = sNoisy(clean, noise_std)
for a in decay_rate:
LLHs.append(logLLH(a, noisy, time))
## Plotting the log-likelihood
fig2 = go.Figure()
fig2.add_trace(go.Scatter(
x=decay_rate,
y=LLHs,
mode='markers+lines',
name='Log-likelihood for one realization',
marker=dict(size=4)
))
# Customize layout
fig2.update_layout(
title='Log-likelihood for one realization',
xaxis_title='Decay Rate (alpha)',
yaxis_title='Log-likelihood',
showlegend=True
)
# Show the plot
fig2.show()
def plotLLH_2(decay, rNumber):
"""Plots the log-likelihood curves of several independent realizations.
Args:
decay (scalar): true decay rate used to generate the realizations.
rNumber (int): number of realizations.
"""
fig2 = go.Figure()
for i in range(rNumber):
clean = s(time, decay)
noisy = sNoisy(clean, noise_std)
LLHs = [logLLH(a, noisy, time) for a in decay_rate]
fig2.add_trace(go.Scatter(
x=decay_rate,
y=LLHs,
mode='markers+lines',
name=f'log-llh for realization no. {i}',
marker=dict(size=4)
))
# Customize layout
fig2.update_layout(
title='Log-likelihood for different realizations',
xaxis_title='Decay Rate (alpha)',
yaxis_title='Log-likelihoods',
showlegend=True
)
# Show the plot
fig2.show()
$\rightsquigarrow$ Plot $\ell(\alpha)$ for $\alpha \in [0, 1]$.
$\rightsquigarrow$ Plot again this curve using other realizations.
# Execute the program & plot the results
if __name__ == "__main__":
"""
Execute the program at this stage.
"""
clean, noisy = plotSignals(decay)
plotLLH(decay)
plotLLH_2(decay, 5)
Q1 What are your comments?
The log-likelihood admits a clear maximum, located close to the true value $\alpha_0 = 0.1$; its exact position fluctuates slightly from one realization to the next because of the noise.
$\rightsquigarrow$ Design a function which finds the maximum likelihood estimator from a realization. You can use the matlab functions fzero or fminsearch.
# Defines the necessary functions and imports necessary modules & libraries
from scipy import optimize
import sys
sys.setrecursionlimit(2000) # Increase recursion limit (default is usually 1000)
def fmaxLLH(alpha_0, noisy, x):
"""finds the minimum of the log-likelyhood function considered as a function of alpha.
Args:
alpha_0 (scalar): initial guess, comprised in [0,1]
noisy (ndarray): noisy signal considered.
x (list): time serie.
Returns:
scalar: alpha_0 such as the log-likelyhood is minimum.
"""
return optimize.fmin(lambda a: -logLLH(a, noisy, x), alpha_0, disp=0) # wrapper syntax, one could have used : optimize.fmin(logLLH, alpha_0, args = (noisy, x))
# Execute main program to find the MLE.
if __name__ == "__main__":
MLE = fmaxLLH(decay, noisy, time)
print(f"Decay estimator that minimizes the function: {MLE}")
Decay estimator that minimizes the function: [0.10882813]
$\rightsquigarrow$ Estimate the statistical mean and variance of the estimator with a Monte Carlo simulation.
# Defines the Monte Carlo simulation
def MonteCarlo(K):
"""Monte Carlo estimation of the statistical mean and variance of the MLE. The running mean and variance over the first k realizations are stored at each iteration to visualize convergence."""
L_MLE = []
meanMLE = []
varMLE = []
for k in range(K):
clean = s(time, decay)
noisy = sNoisy(clean, noise_std)
MLE = fmaxLLH(decay, noisy, time)
L_MLE.append(MLE)
meanMLE.append(np.mean(L_MLE))
varMLE.append(np.var(L_MLE))
# Create the plot
fig3 = go.Figure()
# Add the running mean
fig3.add_trace(go.Scatter(
x=[k for k in range(K)],
y=meanMLE,
mode='markers+lines',
name='Mean of MLE',
marker=dict(size=4)
))
# Add the running variance
fig3.add_trace(go.Scatter(
x=[k for k in range(K)],
y=varMLE,
mode='markers+lines',
name='Var of MLE',
marker=dict(size=4)
))
# Customize layout
fig3.update_layout(
title='Mean and var of MLE obtained by Monte Carlo simulation',
xaxis_title='Iteration',
yaxis_title='Mean and Var of MLE',
showlegend=True
)
# Show the plot
fig3.show()
# Computes the Monte Carlo simulation
if __name__ == "__main__":
K = 10000
MonteCarlo(K)
Q2 Is the estimator biased? Does it reach the CRLB?
# Computes the CRLB
def CRLB():
"""Computes the CRLB by direct summation of the Fisher information over the N samples."""
res = noise_std**2 / amplitude**2
total = 0
for t in time:
total += t**2 * np.exp(-2 * decay * t)
return res / total
# Closed form (infinite-sum approximation):
# CRLB = sigma**2 / s0**2 * (1 - exp(-2*alpha_0))**3 / (exp(-2*alpha_0) * (1 + exp(-2*alpha_0)))
CRLB_value = CRLB()
print(f"The CRLB for a decay rate equal to {decay} is {CRLB_value}.")
The CRLB for a decay rate equal to 0.1 is 4.0000285742884185e-05.
The Monte Carlo variance matches the CRLB, so the estimator is efficient; its mean does not differ significantly from the true value $\alpha_0$, so it is also unbiased.
Useful commands/tips:¶
- fzero: tries to find a zero of a function. The syntax is: fzero(@(x) fonc(x, a_1, a_2), X0) to find the zero of the function fonc($\alpha$, a_1, a_2) with respect to $\alpha$. The parameters a_1, a_2 stay constant during the optimization; X0 is the initial value.
- fminsearch: attempts to find a local minimizer of a function. The syntax is: fminsearch(@(x) fonc(x, a_1, a_2), X0);
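Since this notebook is in Python rather than Matlab, the closest scipy equivalents are `optimize.brentq` (which, unlike fzero, needs a bracketing interval) and `optimize.fmin` (Nelder-Mead, like fminsearch). A sketch with a toy function standing in for the generic `fonc` of the tips above:

```python
from scipy import optimize

def fonc(x, a_1, a_2):
    # toy stand-in for the generic fonc(x, a_1, a_2) of the tip above
    return (x - a_1) * (x - a_2)

# fzero(@(x) fonc(x, a_1, a_2), X0)  ->  brentq over an interval where the sign changes
root = optimize.brentq(lambda x: fonc(x, 1.0, 3.0), 0.0, 2.0)   # root at x = 1

# fminsearch(@(x) fonc(x, a_1, a_2), X0)  ->  Nelder-Mead simplex search
xmin = optimize.fmin(lambda x: fonc(x, 1.0, 3.0), 2.5, disp=0)  # minimum at x = 2
```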
2 Exponential case¶
We now consider that the signal (2.1) is perturbed by exponential noise. Recall that for such a process (with mean $I$) the probability density is:
$$ P(x)=\frac{1}{I} \exp \left[-\frac{x}{I}\right] $$
P3 Express the maximum likelihood estimator of $\alpha_{0}$. Is it the same as the Gaussian case?
The resulting random variable follows an exponential law whose mean is the deterministic clean signal (multiplicative noise):
$$ \begin{equation} \begin{aligned} P\left(s_i \mid \alpha\right)&=\frac{1}{s_{\text{clean}}} \exp \left(-\frac{s_i}{s_{\text {clean}}}\right) \\ l_\alpha&=-\sum\ln \left(s_{\text {clean}}\right)-\sum\left(\frac{s_i}{s_{\text {clean}}}\right) \\ \frac{\partial l_\alpha}{\partial \alpha}&=\frac{\partial}{\partial \alpha}\left[-\sum \ln \left(s_0 e^{-\alpha i}\right)-\sum \frac{s_i}{s_0} e^{\alpha i}\right] \\ \frac{\partial l_\alpha}{\partial \alpha}&=\frac{\partial}{\partial \alpha}\left[-N \ln (s_0) + \alpha \sum i -\sum \frac{s_i}{s_0} e^{\alpha i}\right] \end{aligned} \end{equation} $$
This results in :
$$ \begin{equation} \begin{aligned} \frac{\partial l_\alpha}{\partial \alpha} & = \sum i-\sum \frac{s_i}{s_0} i e^{\alpha i} \\ 0 &= \sum i-\sum \frac{s_i}{s_0} i e^{\alpha_{0} i} \\ 0 &= \sum i\left( 1 - \frac{s_i}{s_0} e^{\alpha_{0} i}\right) \end{aligned} \end{equation} $$
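Again no closed form appears, so the root of the score can be found numerically. A sketch (assuming, per the model above, samples drawn from an exponential law of mean $s_0 e^{-\alpha_0 i}$, with the notebook's values $s_0 = 10$, $\alpha_0 = 0.1$, $N = 100$):

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
N, s0, alpha0 = 100, 10.0, 0.1
i = np.arange(N)
s_clean = s0 * np.exp(-alpha0 * i)
data = rng.exponential(s_clean)  # exponential samples with mean s_clean (multiplicative noise)

def score(alpha):
    """Score of the exponential model: sum_i i (1 - (s_i / s0) e^{alpha i})."""
    return np.sum(i * (1.0 - (data / s0) * np.exp(alpha * i)))

alpha_hat = brentq(score, 1e-3, 0.5)  # assumed bracket; the score is positive near 0 and very negative at 0.5
```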
P4 Express the CRLB for $\alpha_{0}$ estimation. In order to obtain an analytical form, we can use the result:
$$ \begin{equation} \begin{aligned} \frac{\partial^2 l_\alpha}{\partial \alpha^2} & =-\frac{1}{s_0} \sum s_i i^2 e^{\alpha i} \\ \left\langle\frac{\partial^2 l_\alpha}{\partial \alpha^2}\right\rangle & =-\frac{1}{s_0} \sum\left\langle s_i\right\rangle i^2 e^{\alpha i} \\ & =-\sum_{i=0}^{N-1} i^2 \end{aligned} \end{equation} $$
$$ \sum_{n=0}^{N} n^{2}=\frac{N(N+1)(2 N+1)}{6} $$
$$ \begin{equation} \text{CRLB}=-\frac{1}{\left\langle\frac{\partial^2 l_\alpha}{\partial \alpha^2}\right\rangle}=\frac{6}{(N-1)N(2N-1)} \end{equation} $$
(the identity is applied with $N-1$, since the sample index runs from $0$ to $N-1$).
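Numerically, summing $i^2$ over the actual sample indices $0 \dots N-1$ (a quick check for $N = 100$):

```python
N = 100
fisher = sum(i**2 for i in range(N))  # expected value of -d^2 l / d alpha^2
crlb_exp = 1.0 / fisher
print(fisher, crlb_exp)  # 328350, ~3.05e-06: it depends only on N
```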
What are the parameters the CRLB depends on? Here it depends only on the number of samples $N$; neither $s_0$ nor $\alpha_0$ appears.
$\rightsquigarrow$ Design a Matlab function which gives a realization of the stochastic process.
# design the necessary functions to compute the noisy signal and the log-likeyhood.
def sNoisyExp(clean):
"""Computes the noisy signal: each sample is drawn from an exponential law whose mean is the clean signal (multiplicative exponential noise, matching the likelihood model above).
Args:
clean (ndarray): clean signal.
Returns:
ndarray: noisy signal.
"""
# Draw exponential noise with mean equal to the clean signal
res = np.random.exponential(clean, size=clean.shape)
return res
def logLLHExp(alpha, noisy, x):
"""Computes the log-likelyhood of a noisy signal for a specific alpha.
Args:
alpha (scalar): decay rate of the determinist "clean signal""Données pour les TP-20240920"
noisy (ndarray): noisy signal.
x (list): time serie.
Returns:
scalar: log-likelyhood for a specific decay rate and for a selected noisy signal.
"""
clean = s(x, alpha)
# LLH = -np.sum(np.log(clean)) - np.sum(noisy / clean)
ratio = np.divide(noisy, clean)
LLH = - N * np.log(amplitude) + alpha * np.sum(x) - np.sum(ratio)
return LLH
# Define ploting functions
def plotSignalsExp(decay):
"""Generates one realization of the signal with multiplicative exponential noise and plots it together with the clean signal.
Args:
decay (scalar): decay rate of the deterministic exponential signal.
Returns:
tuple: clean signal, noisy signal.
"""
# Time-domain signals, alpha = 0.1
clean = s(time, decay)
noisy = sNoisyExp(clean)
## Plotting the signals
fig = go.Figure()
# Add the noisy-signal trace
fig.add_trace(go.Scatter(
x=time,
y=noisy,
mode='markers+lines',
name='Noisy signal',
marker=dict(size=4)
))
# Add the clean-signal trace
fig.add_trace(go.Scatter(
x=time,
y=clean,
mode='markers+lines',
name='Clean signal',
marker=dict(size=2)
))
# Customize layout
fig.update_layout(
title='Noisy and clean signal (exponential noise), for alpha = 0.1',
xaxis_title='Time',
yaxis_title='Signals',
showlegend=True
)
# Show the plot
fig.show()
return clean, noisy
def plotLLHExp(decay):
"""Plots the log-likelihood of one noisy realization (exponential noise) as a function of alpha.
Args:
decay (scalar): true decay rate used to generate the realization.
"""
# Log-likelihoods for varying alpha
LLHs = []
clean = s(time, decay)
noisy = sNoisyExp(clean)
for a in decay_rate:
LLHs.append(logLLHExp(a, noisy, time))
## Plotting the log-likelihood
fig2 = go.Figure()
fig2.add_trace(go.Scatter(
x=decay_rate,
y=LLHs,
mode='markers+lines',
name='Log-likelihood for one realization',
marker=dict(size=4)
))
# Customize layout
fig2.update_layout(
title='Log-likelihood for one realization (exponential noise)',
xaxis_title='Decay Rate (alpha)',
yaxis_title='Log-likelihood',
showlegend=True
)
fig2.update_yaxes(type="log")
# Show the plot
fig2.show()
def plotLLH_2Exp(decay, rNumber):
"""Plots the log-likelihood curves of several independent realizations (exponential noise).
Args:
decay (scalar): true decay rate used to generate the realizations.
rNumber (int): number of realizations.
"""
fig2 = go.Figure()
for i in range(rNumber):
clean = s(time, decay)
noisy = sNoisyExp(clean)
LLHs = [logLLHExp(a, noisy, time) for a in decay_rate]
fig2.add_trace(go.Scatter(
x=decay_rate,
y=LLHs,
mode='markers+lines',
name=f'log-llh for realization no. {i}',
marker=dict(size=4)
))
# Customize layout
fig2.update_layout(
title='Log-likelihood for different realizations (exponential noise)',
xaxis_title='Decay Rate (alpha)',
yaxis_title='Log-likelihoods',
showlegend=True,
)
fig2.update_yaxes(type="log")
# Show the plot
fig2.show()
# Execute the program & plot the results
if __name__ == "__main__":
"""
Execute the program at this stage.
"""
clean, noisy = plotSignalsExp(decay)
plotLLHExp(decay)
plotLLH_2Exp(decay, 5)
Q3 On which parameters does it depend?
$\rightsquigarrow$ Design a function which finds the maximum likelihood estimator from an experiment. We can use the Matlab functions fzero or fminsearch.
# Defines the necessary functions and imports necessary modules & libraries
def fmaxLLHExp(alpha_0, noisy, x):
"""finds the minimum of the log-likelyhood function considered as a function of alpha.
Args:
alpha_0 (scalar): initial guess, comprised in [0,1]
noisy (ndarray): noisy signal considered.
x (list): time serie.
Returns:
scalar: alpha_0 such as the log-likelyhood is minimum.
"""
return optimize.fmin(lambda a: -logLLHExp(a, noisy, x), alpha_0, disp=0) # wrapper syntax, one could have used : optimize.fmin(logLLH, alpha_0, args = (noisy, x))
# Execute main program to find the MLE.
if __name__ == "__main__":
MLE = fmaxLLHExp(decay, noisy, time)
print(f"Decay estimator that minimizes the function: {MLE}")
Decay estimator that minimizes the function: [0.08792969]
$\rightsquigarrow$ Estimate statistical mean and variance of the estimator with a Monte Carlo simulation.
# Defines the Monte Carlo simulation
def MonteCarloExp(K):
"""Monte Carlo estimation of the mean and variance of the exponential-case MLE. The running mean and variance over the first k realizations are stored at each iteration to visualize convergence."""
L_MLE = []
meanMLE = []
varMLE = []
for k in range(K):
clean = s(time, decay)
noisy = sNoisyExp(clean)
MLE = fmaxLLHExp(decay, noisy, time)
L_MLE.append(MLE)
meanMLE.append(np.mean(L_MLE))
varMLE.append(np.var(L_MLE))
# Create the plot
fig3 = go.Figure()
# Add the running mean
fig3.add_trace(go.Scatter(
x=[k for k in range(K)],
y=meanMLE,
mode='markers+lines',
name='Mean of MLE',
marker=dict(size=4)
))
# Add the running variance
fig3.add_trace(go.Scatter(
x=[k for k in range(K)],
y=varMLE,
mode='markers+lines',
name='Var of MLE',
marker=dict(size=4)
))
# Customize layout
fig3.update_layout(
title='Mean and var of MLE obtained by Monte Carlo simulation',
xaxis_title='Iteration',
yaxis_title='Mean and Var of MLE',
showlegend=True
)
# Show the plot
fig3.show()
# Computes the Monte Carlo simulation
if __name__ == "__main__":
K = 10000
MonteCarloExp(K)
Q4 Is the estimator biased? Does it reach the CRLB?
As in the Gaussian case, the Monte Carlo variance matches the CRLB, so the estimator is efficient, and its mean does not differ significantly from the theoretical value, so it is unbiased.
Useful commands/tips:¶
- randexp: $\mathrm{R}=\mathrm{randexp}(1, \mathrm{n}, \mathrm{lambda})$ returns pseudo-random values drawn from an exponential distribution with means stored in the $(1 \times n)$ vector lambda.
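In this notebook's Python setting, the numpy equivalent is `np.random.exponential`; note that its `scale` argument is the mean. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # vector of means, like the (1 x n) lambda
R = rng.exponential(scale=lam)             # one draw per mean, shape (5,)

sample = rng.exponential(scale=3.0, size=100_000)
print(sample.mean())  # close to 3.0
```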
3 Estimator comparison¶
$\rightsquigarrow$ Apply the maximum likelihood estimator adapted to the normal noise to a signal perturbed by exponential noise. Plot the estimate for several realizations. Estimate the variance.
Q5 What are your comments?
-> to be continued
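A possible sketch of this experiment (an assumption-laden illustration, not the finished lab answer: the Gaussian-noise MLE reduces here to a least-squares fit of $\alpha$, applied to data corrupted by multiplicative exponential noise; the helper name `gaussian_mle` is hypothetical):

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(0)
N, s0, alpha0, K = 100, 10.0, 0.1, 200
i = np.arange(N)
clean = s0 * np.exp(-alpha0 * i)

def gaussian_mle(sig):
    """Gaussian-noise MLE of alpha = least-squares fit (bounded scalar search)."""
    cost = lambda a: np.sum((sig - s0 * np.exp(-a * i))**2)
    return optimize.minimize_scalar(cost, bounds=(0.0, 1.0), method='bounded').x

# Apply the mismatched estimator to K exponential-noise realizations
estimates = [gaussian_mle(rng.exponential(clean)) for _ in range(K)]
print(np.mean(estimates), np.var(estimates))
```

One expects the empirical variance to lie far above the exponential-case CRLB derived above, since least squares ignores the signal-dependent noise level; that mismatch is the point of the comparison.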